Neural Network

  • Cost Function
  • Back-propagation
  • Optimization Objective
  • Function Fit

Cost Function

The cost function for a neural network is: $$ J(\Theta) = -\frac{1}{m} \left[ \sum_{i=1}^{m} \sum_{k=1}^{K} y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k + \big(1-y_k^{(i)}\big) \log\Big(1-\big(h_\Theta(x^{(i)})\big)_k\Big) \right] + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \big(\Theta_{ji}^{(l)}\big)^2 $$

where $L$ is the total number of layers, $s_l$ is the number of units (not counting the bias unit) in layer $l$, and $K = s_L$ is the number of output units. Note that the regularization term does not sum over the bias weights.
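As a concrete illustration, here is a minimal NumPy sketch of this cost for a one-hidden-layer network (so $L = 3$). The names `nn_cost`, `Theta1`, and `Theta2` are illustrative, not from any particular library; the labels `Y` are assumed to be one-hot encoded.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(Theta1, Theta2, X, Y, lam):
    """Regularized cost J(Theta) for a one-hidden-layer network.

    X : (m, n) inputs; Y : (m, K) one-hot labels;
    Theta1 : (s2, n+1); Theta2 : (K, s2+1) weight matrices.
    """
    m = X.shape[0]
    # forward propagation, prepending a bias unit at each layer
    a1 = np.hstack([np.ones((m, 1)), X])
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ Theta1.T)])
    h = sigmoid(a2 @ Theta2.T)                       # (m, K) hypothesis
    # cross-entropy term, summed over examples i and output units k
    J = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
    # regularization term, skipping the bias column of each Theta
    J += lam / (2 * m) * (np.sum(Theta1[:, 1:] ** 2) + np.sum(Theta2[:, 1:] ** 2))
    return J
```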

Propagation

Forward propagation

The process, for input $x$ and activations $a^{(l)}$ in layer $l$: $$ a^{(1)} = x \;\xrightarrow{\Theta^{(1)}}\; a^{(2)} = g\big(\Theta^{(1)} a^{(1)}\big) \;\xrightarrow{\Theta^{(2)}}\; \cdots \;\xrightarrow{\Theta^{(L-1)}}\; a^{(L)} = h_\Theta(x) $$ where $g$ is the sigmoid activation and a bias unit $a_0^{(l)} = 1$ is added to each layer before applying $\Theta^{(l)}$.
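A minimal sketch of this layer-by-layer computation for a single example, assuming `thetas` is the list $[\Theta^{(1)}, \dots, \Theta^{(L-1)}]$ and reusing the `sigmoid` defined above (the function name `forward` is illustrative):

```python
def forward(thetas, x):
    """Propagate one example x through every layer of the network."""
    a = x
    for Theta in thetas:
        a = np.concatenate([[1.0], a])  # prepend bias unit a_0 = 1
        a = sigmoid(Theta @ a)          # a^(l+1) = g(Theta^(l) a^(l))
    return a                            # a^(L) = h_Theta(x)
```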

Back propagation
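Back propagation computes the gradient of $J(\Theta)$. Here $\delta_j^{(l)}$ is the "error" of node $j$ in layer $l$: for the output layer $\delta^{(L)} = a^{(L)} - y$, and the errors propagate backwards via $$ \delta^{(l)} = \big(\Theta^{(l)}\big)^T \delta^{(l+1)} \odot g'(z^{(l)}), \qquad g'(z^{(l)}) = a^{(l)} \odot \big(1 - a^{(l)}\big), $$ where $\odot$ is the element-wise product and the bias component of $\delta^{(l)}$ is dropped. The gradients accumulate over examples as $\Delta^{(l)} := \Delta^{(l)} + \delta^{(l+1)} \big(a^{(l)}\big)^T$, giving $\frac{\partial J}{\partial \Theta_{ij}^{(l)}} = \frac{1}{m} \Delta_{ij}^{(l)}$, plus $\frac{\lambda}{m} \Theta_{ij}^{(l)}$ for $j \geq 1$.

A vectorized NumPy sketch for the same one-hidden-layer setup as `nn_cost` above (names again illustrative):

```python
def backprop(Theta1, Theta2, X, Y, lam):
    """Gradients of the regularized cost for a one-hidden-layer network."""
    m = X.shape[0]
    # forward pass, keeping intermediate values
    a1 = np.hstack([np.ones((m, 1)), X])        # (m, n+1)
    z2 = a1 @ Theta1.T
    g2 = sigmoid(z2)
    a2 = np.hstack([np.ones((m, 1)), g2])       # (m, s2+1)
    a3 = sigmoid(a2 @ Theta2.T)                 # (m, K) = h_Theta(X)
    # backward pass: output error, then hidden-layer error
    d3 = a3 - Y                                 # delta^(3)
    d2 = (d3 @ Theta2)[:, 1:] * g2 * (1 - g2)   # delta^(2), bias dropped
    # accumulate and average: Delta^(l) = delta^(l+1)' a^(l)
    D1 = d2.T @ a1 / m
    D2 = d3.T @ a2 / m
    # regularization gradient, skipping the bias columns
    D1[:, 1:] += lam / m * Theta1[:, 1:]
    D2[:, 1:] += lam / m * Theta2[:, 1:]
    return D1, D2
```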

Gradient Checking
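To verify a back-propagation implementation, compare its analytic gradients against the two-sided numerical approximation $$ \frac{\partial}{\partial \theta_i} J(\theta) \approx \frac{J(\theta + \epsilon e_i) - J(\theta - \epsilon e_i)}{2\epsilon} $$ for a small $\epsilon$ (e.g. $10^{-4}$); the two should agree to several decimal places. A minimal sketch, where the function name and the `eps` default are illustrative:

```python
def numerical_gradient(J, theta, eps=1e-4):
    """Two-sided finite-difference approximation of the gradient of J at theta."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e.flat[i] = eps                 # perturb one component at a time
        grad.flat[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad
```

Because each component requires two full cost evaluations, gradient checking is only used to validate the implementation; it should be switched off during actual training.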

